Activity Distinction using Android Sensors
Difficulty Level:
Tags: other, android, opensignals mobile, activity distinction

The OpenSignals mobile application allows for the acquisition of biosignals using PLUX sensors. Additionally, the App enables you to acquire data from the internal sensors of the phone on which the application is running, offering a simple and straightforward way to record data from a variety of sensors.

A possible use case for the OpenSignals mobile application is recording data from patients or subjects while they are at home. This is particularly useful when patients/subjects are not able to come to the research facility due to either personal or governmental restrictions. Furthermore, it offers the possibility to record data while the patient/subject is in his/her natural daily environment.

When patients/subjects do recordings at home over longer periods of time, the patient/subject is usually asked to keep a daily journal in which all the activities performed during the day are written down. These journals, however, may not be absolutely accurate and thus do not offer enough temporal resolution to reliably determine the start and end times of activities in relation to the recorded signals.
To overcome these constraints, Android sensor data can be helpful to infer the activities that the patient/subject performed throughout the recording time. Using Android's internal sensors is a practical solution, since it is not necessary to equip the patient/subject with additional sensors and the patient/subject usually carries the device throughout the day.

In this Jupyter Notebook we will show you that the data acquired from Android sensors can be leveraged to visually distinguish between four different activities, namely between:

  • sitting,
  • lying down,
  • standing, and
  • walking.

In case this is your first time working with Android sensors, we highly recommend reading this notebook, which provides all the general information on Android sensors that you need to know.


1 - Equipment

To showcase what an exemplary recording of a patient's/subject's data could look like, data was recorded not only from internal Android sensors but also from two PLUX sensors.

The following PLUX devices/sensors were used:

The cardioBAN is a wearable device that consists of a shoulder-chest harness strap with an integrated ECG sensor and a 3-axis accelerometer. The ECG uses three dry electrodes that are integrated during the fabrication of the chest strap. This wearable allows for a comfortable acquisition of cardiac signals and basic motion data in dynamic conditions.
The SpO2 (versatile) is a digital sensor designed for peripheral capillary oxygen saturation level estimation using two LEDs, one in the red region and the other in the infrared region of the spectrum. Its design allows placing the sensor on body regions other than the traditional finger clip or ear lobe placements. The sensor was plugged into the digital port of the cardioBAN. In order to be able to read data from the digital sensor, the cardioBAN has to be updated to the newest firmware. In case you do not know how to achieve this, please consult the OpenSignals manual.

Android data was acquired using a Samsung Galaxy A40. Data from the following sensors was recorded:

  • Accelerometer (ACC)
  • Gyroscope (GYR)
  • Magnetometer (MAG)

The choice of these three sensors is equivalent to using an Inertial Measurement Unit (IMU). These units are commonly used for movement tracking, and thanks to the Android sensor functionality within the OpenSignals mobile application we have such a powerful tool readily at our disposal.

All sensors were set to a sampling rate of 100 Hz in order to facilitate the synchronisation process.

2 - Sensor placement

Throughout all recordings, the sensors and the phone were placed as follows:

  • CardioBAN: The shoulder-chest harness was strapped around the upper body, just below the chest, in such a way that the three dry electrodes were placed underneath the left pectoralis major muscle (near the location of the heart). Care was taken to strap the harness tightly enough to ensure that all three electrodes were in contact with the skin. The accelerometer sensor of the cardioBAN was placed in the middle of the chest between the two pectoralis major muscles and fixed with adhesives. The sensor was placed in such a way that its y-axis was pointing upwards towards the head.

  • SpO2 (versatile): The sensor was placed on the left temple of the head. It was fixed with adhesives and additionally a headband was used to occlude the sensor from any external light sources. This placement was chosen because it provides good SpO2 readings and it does not hinder the person wearing the device in his/her movement.

  • Samsung Galaxy A40: The phone was always placed in the right front pocket of the trousers the subject was wearing. The phone was placed in such a way that its z-axis was pointing towards the leg and its y-axis was approximately parallel to the leg, pointing towards the knee.

The image below illustrates the placement of the sensors. On the left side, the placement of the PLUX sensors is shown. For illustration purposes, the individual components, the ECG electrodes, the accelerometer, and the SpO2 sensor, are highlighted in different colors. The image depicts the SpO2 placement without the headband that was used to occlude the sensor from external light sources. On the right side, the placement of the phone within the front pocket is displayed.

3 - Performed Activities and signal recording

As described at the beginning of this Jupyter Notebook, four different activities were performed. Each activity was recorded for approximately 3 minutes and was executed in the following way:

  • Sitting: The subject sat on a chair in front of a desk in an upright position. Both feet were placed in parallel to each other on the floor. The arms were placed on the table.

  • Lying down: The subject lay flat on his/her back on a flat surface. The arms were placed in a relaxed state at the sides of the body.

  • Standing: The subject stood upright in a comfortable position. The feet were placed in parallel to each other.

  • Walking: The subject walked in a relaxed walking gait along a sidewalk in an open air setting.

In order to synchronise the signals from the PLUX and Android sensors, an approach similar to the one in the Synchronising Android and PLUX sensors Jupyter Notebook was employed. However, in this case the distinctive event was not generated on the z-axis of both devices but on the y-axis. The distinctive event was generated by simply jumping up and down. To ensure that the y-axes of both devices were properly aligned, at the beginning of each recording the phone was held in front of the chest in such a manner that the y-axes of both devices pointed in the same direction (upwards). Then the subject jumped up and down to generate the distinctive event used for synchronisation. Afterwards, the phone was placed in the right front pocket of the subject's trousers and the subject started to perform the activity for 3 minutes as described above.
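The alignment step can be sketched as a cross-correlation: since the jump event dominates the y-axis acceleration of both devices, the lag at which the correlation between the two signals peaks gives the sample shift between the recordings. The function below is a simplified illustration of this idea (`find_sync_shift` is our own hypothetical helper, not the routine used by OpenSignals or biosignalsnotebooks):

```python
import numpy as np

def find_sync_shift(acc_y_plux, acc_y_android):
    """Estimate how many samples the Android recording lags behind the
    PLUX recording, assuming both y-axis signals contain the same
    distinctive jump event and share the same sampling rate (100 Hz)."""
    # remove the static (gravity) offset so that the correlation is
    # dominated by the jump event rather than the baseline
    a = np.asarray(acc_y_plux, dtype=float) - np.mean(acc_y_plux)
    b = np.asarray(acc_y_android, dtype=float) - np.mean(acc_y_android)

    # full cross-correlation; the lag at which it peaks is the shift
    corr = np.correlate(b, a, mode="full")
    return int(np.argmax(corr)) - (len(a) - 1)
```

A positive return value means the jump appears later in the Android signal; shifting that signal by the returned number of samples aligns the two recordings.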

4 - Data pre-processing

Before the signals were plotted for this Jupyter Notebook , some pre-processing steps were performed. These included:

  • Synchronisation of Android sensors: For the synchronisation of the Android sensors, the same approach as described in Synchronising data from multiple Android sensor files into one file was used. The signals were synchronised using a sampling rate of 100 Hz.

  • Synchronisation of PLUX and Android sensors: The synchronisation of the PLUX and Android sensors was performed as described above. As part of the synchronisation, the PLUX sensor signals were converted from RAW to physical units.

  • Cropping the signals: The synchronised signals were then cropped to a length of 60 seconds. It was visually ensured that the cropped section was part of the performed activity.

  • Signal smoothing: Android sensor signals were smoothed using the smooth(...) function of the biosignalsnotebooks package with a Hanning window of length 30. The smoothing was done to provide visually more distinguishable signals while still preserving all the important signal information.
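For reference, the Hanning-window smoothing can be sketched in plain NumPy. This is a conceptual equivalent of the `bsnb.smooth(...)` call, i.e. a convolution with a normalised Hanning window after reflecting the signal at both edges; the exact implementation in the biosignalsnotebooks package may differ in detail:

```python
import numpy as np

def smooth_hanning(signal, window_len=30):
    """Smooth a 1-D signal with a normalised Hanning window
    (conceptually what the bsnb.smooth call in this notebook does).
    window_len is assumed to be even, as in this notebook (30)."""
    x = np.asarray(signal, dtype=float)
    # reflect the signal at both edges to reduce boundary artefacts
    padded = np.r_[x[window_len - 1:0:-1], x, x[-2:-window_len - 1:-1]]
    win = np.hanning(window_len)
    y = np.convolve(win / win.sum(), padded, mode="valid")
    # trim the padded ends so the output matches the input length
    return y[(window_len // 2 - 1):-(window_len // 2)]
```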

Package imports and definition of some functions needed for plotting

In [1]:
# biosignalsnotebooks package
import biosignalsnotebooks as bsnb

# importing numpy
import numpy as np
In [2]:
# import gridplot and show
from bokeh.layouts import gridplot
from bokeh.plotting import show, figure

def generate_colors():
    """
    Utility function for generating the same colors for all plots within this notebook. It is only needed within this notebook.
    
    Parameters
    ----------
    None
    
    Returns
    -------
    A color palette for each sensor
    """
    
    # generate color for all sensors
    ECG = bsnb.opensignals_color_pallet()
    SpO2_R = bsnb.opensignals_color_pallet()
    SpO2_IR = bsnb.opensignals_color_pallet()
    ACC = [bsnb.opensignals_color_pallet(), bsnb.opensignals_color_pallet(), bsnb.opensignals_color_pallet()]
    GYR = [bsnb.opensignals_color_pallet(), bsnb.opensignals_color_pallet(), bsnb.opensignals_color_pallet()]
    MAG = [bsnb.opensignals_color_pallet(), bsnb.opensignals_color_pallet(), bsnb.opensignals_color_pallet()]
    
    return [ECG, SpO2_R, SpO2_IR, ACC, GYR, MAG]

    
def plot_grid(file, color_pallet):
    """
    Utility function for plotting the data for this notebook into a grid. It is only needed within this notebook.
    # ------------------------
    # cardioBAN data: ECG - CH1
    #                 SPO2_1 - CH9
    #                 SPO2_2 - CH10

    # android data: ACC - CH0-CH2
    #               GYR - CH3-CH5
    #               MAG - CH6-CH8
    
    Parameters
    ----------
    file (String): the file path to the .txt file which is supposed to be plotted.
    color_pallet (list): the color palette returned by generate_colors().
    
    Returns
    -------
    No return argument
    """

    # define hanning window length
    wl = 30

    # get the time axis
    time = np.loadtxt(file)[:, 0]

    # load the data from the synchronised txt file
    # the data is loaded into a dictionary that contains two sub-dictionaries:
    # the first dictionary holds the cardioBAN data,
    # the second dictionary holds the android data
    data = bsnb.load(file, get_header=False)

    # extract the data from the dictionaries
    plux_data = data[list(data.keys())[0]]
    android_data = data[list(data.keys())[1]]

    # create figures for the grid plot. The final grid plot will consist of six figures (ECG, SpO2-R, SpO2-IR, ACC, GYR, MAG)
    # 1 ECG
    ECG = figure()
    ECG.line(list(time), list(plux_data['CH1']), color=color_pallet[0], legend_label='ECG')
    ECG.yaxis.axis_label='ECG'
    bsnb.opensignals_style([ECG]) 

    # 2 SpO2-R
    SpO2_R = figure()
    SpO2_R.line(list(time), list(plux_data['CH9']), color=color_pallet[1], legend_label='SpO2-Red')
    SpO2_R.yaxis.axis_label='SpO2-Red'
    SpO2_R.legend.location = 'bottom_right'
    bsnb.opensignals_style([SpO2_R]) 

    # 3 SpO2-IR
    SpO2_IR = figure()
    SpO2_IR.line(list(time), list(plux_data['CH10']), color=color_pallet[2], legend_label='SpO2-Infrared')
    SpO2_IR.xaxis.axis_label='Time (s)'
    SpO2_IR.yaxis.axis_label='SpO2-Infrared'
    SpO2_IR.legend.location = 'bottom_right'
    bsnb.opensignals_style([SpO2_IR])

    # 4 ACC
    ACC = figure() 
    ACC.line(list(time), list(bsnb.smooth(android_data["CH0"],window_len=wl)), color=color_pallet[3][0], legend_label='X-ACC')
    ACC.line(list(time), list(bsnb.smooth(android_data["CH1"],window_len=wl)), color=color_pallet[3][1], legend_label='Y-ACC')
    ACC.line(list(time), list(bsnb.smooth(android_data["CH2"],window_len=wl)), color=color_pallet[3][2], legend_label='Z-ACC')
    ACC.yaxis.axis_label='Accelerometer'
    bsnb.opensignals_style([ACC])

    # 5 GYR
    GYR = figure()
    GYR.line(list(time), list(bsnb.smooth(android_data["CH3"],window_len=wl)), color=color_pallet[4][0], legend_label='X-GYR')
    GYR.line(list(time), list(bsnb.smooth(android_data["CH4"],window_len=wl)), color=color_pallet[4][1], legend_label='Y-GYR')
    GYR.line(list(time), list(bsnb.smooth(android_data["CH5"],window_len=wl)), color=color_pallet[4][2], legend_label='Z-GYR')
    GYR.yaxis.axis_label='Gyroscope'
    bsnb.opensignals_style([GYR])

    # 6 MAG
    MAG = figure()
    MAG.line(list(time), list(bsnb.smooth(android_data["CH6"],window_len=wl)), color=color_pallet[5][0], legend_label='X-MAG')
    MAG.line(list(time), list(bsnb.smooth(android_data["CH7"],window_len=wl)), color=color_pallet[5][1], legend_label='Y-MAG')
    MAG.line(list(time), list(bsnb.smooth(android_data["CH8"],window_len=wl)), color=color_pallet[5][2], legend_label='Z-MAG')
    MAG.yaxis.axis_label='Magnetometer'
    bsnb.opensignals_style([MAG])

    # final grid plot
    grid = gridplot([[ECG, ACC], [SpO2_R, GYR], [SpO2_IR, MAG]], **bsnb.opensignals_kwargs("gridplot"))
    grid.sizing_mode = 'scale_width'
    show(grid)
    
# generate color palette (only once so that all plots use the same colors)
color_pallet = generate_colors()

5 - Comparing the signals of the recorded activities

Now that we have all of the important information out of the way, we can start looking at the signals that each activity generates. In this section we will compare these signals and point out the visual differences that let us distinguish between the activities. As we will see, the activities are not necessarily distinguishable based on the ECG and SpO2 data recorded with the PLUX sensors alone. However, the signals gathered from the internal Android sensors will allow us to make this distinction.

5.1 - Sitting

Below, the signals generated while sitting are shown. By manually counting the peaks, we can estimate the heart rate at around 65 bpm. The readings from the red and infrared LEDs of the SpO2 sensor follow roughly the same path, with the two being slightly shifted in intensity relative to each other. This indicates that blood oxygenation levels are most likely above 90 %.
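The manual peak count can be automated with a simple threshold-based peak detector. The sketch below uses `scipy.signal.find_peaks` with a height threshold and a minimum inter-peak distance of 0.4 s; this is an illustration only, as a real pipeline would use a dedicated R-peak detection algorithm:

```python
import numpy as np
from scipy.signal import find_peaks

def estimate_heart_rate(ecg, fs=100):
    """Rough heart-rate estimate (in bpm) obtained by counting the
    dominant peaks of an ECG segment, mirroring the manual count."""
    ecg = np.asarray(ecg, dtype=float)
    # peaks must rise well above the mean and be at least 0.4 s apart
    height = ecg.mean() + 0.5 * (ecg.max() - ecg.mean())
    peaks, _ = find_peaks(ecg, height=height, distance=int(0.4 * fs))
    return 60.0 * len(peaks) / (len(ecg) / fs)
```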

Looking at the Android sensor data, we can observe that there are almost no changes within the respective axes of each sensor.

The accelerometer shows values that are expected. The acceleration along the x-axis is above 8 m/s². This makes sense because the phone's x-axis is pointing upwards while sitting. The fact that it does not measure exactly 9.81 m/s² is probably due to the phone being slightly tilted forward or backward in the pocket, or the subject's thighs not being perpendicular to the chair. This tilt can also be observed in the acceleration along the y- and z-axis of the phone. Here we can see that the acceleration along the y-axis is slightly below 0, at roughly -1.6 m/s², and along the z-axis it is around -6.4 m/s². With all this information in mind, we can assume that the phone is slightly tilted forward along the x-y-plane (assuming mathematical rotation) and also tilted towards the leg with respect to the z-axis. It is plausible that the phone is tilted along the z-axis because while sitting the fabric of the trousers is more stressed and thus the phone is pressed towards the leg.
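This tilt reasoning can be made quantitative: averaging a static accelerometer segment gives an estimate of the gravity vector in phone coordinates, and the angle between each axis and that vector expresses the tilt. A minimal sketch (the function name is ours, not part of any package used here):

```python
import numpy as np

def axis_tilt_deg(acc_x, acc_y, acc_z):
    """Angle (in degrees) between each phone axis and the measured
    gravity vector, estimated from a static accelerometer segment."""
    g = np.array([np.mean(acc_x), np.mean(acc_y), np.mean(acc_z)])
    unit = g / np.linalg.norm(g)
    # clip guards against rounding slightly outside [-1, 1]
    return np.degrees(np.arccos(np.clip(unit, -1.0, 1.0)))
```

Feeding in the mean accelerations reported above for sitting quantifies how far each phone axis deviates from the vertical.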

The gyroscope data shows that the subject was sitting quite calmly during the recording. All three channels show only minor deviations from the origin.

Finally, the magnetometer indicates the same as the gyroscope: the subject did not perform major movements while seated. Since the magnetometer measures the geomagnetic field strength, the readings do not change much when the subject doesn't move.

In [3]:
# set file path
file = '../../images/other/activity_distinction/' + 'sitting.txt'

# plot the data that was recorded while sitting
plot_grid(file, color_pallet)

5.2 - Lying down

Looking at the signals from when the subject is lying down, at first glance they do not look too different from sitting. The heart rate is around 61 bpm and the SpO2 sensor returns values in a similar range.

The Android signals do not look so different either. However, on closer inspection we can make out some slight differences.

A look at the accelerometer signals reveals that the acceleration along the y-axis is much closer to zero. Also, the acceleration along the z-axis is shifted slightly upwards (from roughly -6.4 m/s² to approximately -5.2 m/s²). This makes sense, because when the subject is lying down, the phone is much more aligned with the leg and, as a consequence, with the surface on which the subject is lying. However, one has to keep in mind that if the surface is tilted, so will be the sensor readings from the phone.

The gyroscope signals do not differ much from the ones recorded while sitting. This is plausible, because the subject did not move much while performing either of these activities.

The magnetometer data also does not show much variability in the geomagnetic field along each axis. Yet, there is one difference that we can make out between these two recordings. The geomagnetic field reading along the y-axis changed from around 11 $\mu$T to -22 $\mu$T, and along the z-axis a change from roughly 2.6 $\mu$T to 9 $\mu$T can be observed. This difference, however, unfortunately does not help us distinguish the two activities. In this case, the difference rather hints at the direction the subject is facing: while performing the sitting activity, the phone's y-axis was approximately facing west, while during lying down it was facing roughly south. Please keep in mind that the readings for the geomagnetic field may differ based on your location on Earth and can also be influenced by objects that generate their own electromagnetic field.

In [4]:
# set file path
file = '../../images/other/activity_distinction/' + 'lying.txt'

# plot the data that was recorded while lying down
plot_grid(file, color_pallet)

5.3 - Standing

The signals generated while standing are shown below. Here we can make a clearer distinction from the previous two activities. The heart rate slightly increased to roughly 79 bpm. A slight increase in heart rate is usually observed when changing from a lying or sitting position to a standing position. However, it is important to note that a slight increase in heart rate does not necessarily mean that a person is standing, so this cannot be reliably used as an indicator for distinguishing this activity. For the SpO2 sensor we still observe the same range of values as before.

The accelerometer data, however, shows a clear distinction between the acceleration along its respective axes. Mainly, it can be observed that the acceleration along the y-axis is now approximately the value of the earth's gravity (g $\approx$ 9.81 m/s²), just in the negative direction. This of course makes sense, because when standing the y-axis of the phone points towards the floor. The acceleration along the x- and z-axis is close to zero (around 1 and -1 m/s², respectively) because both of these axes are nearly parallel to the floor, with minor shifts due to the position of the phone within the front pocket of the subject's trousers.
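Following this reasoning, a crude posture cue can be derived by checking which phone axis carries most of the gravity component in a static segment: with the phone in the front pocket, gravity loads mainly the x-axis while sitting and the y-axis while standing. A minimal sketch (our own helper, not part of the packages used here):

```python
import numpy as np

def gravity_axis(acc_xyz):
    """Return the phone axis ('x', 'y' or 'z') with the largest mean
    absolute acceleration, i.e. the axis closest to the gravity
    direction in a static recording. acc_xyz is a 3 x N array-like
    holding the x-, y- and z-axis samples."""
    means = np.abs(np.mean(np.asarray(acc_xyz, dtype=float), axis=1))
    return "xyz"[int(np.argmax(means))]
```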

We can also see higher variability in the gyroscope signals. This is due to the fact that when someone is standing upright (assuming they are not leaning against a wall or supported by something), the person usually tends to sway slightly (i.e. moving the upper body back and forth and to the sides).

The same as for the last two activities can be said about the magnetometer here. Since the subject is not moving too much, the readings cannot be directly used to distinguish this activity. During activities in which the subject does not move much, the information that can be retrieved is more about orientation relative to the earth's magnetic field.

In [5]:
# set file path
file = '../../images/other/activity_distinction/' + 'standing.txt'

# plot the data that was recorded while standing
plot_grid(file, color_pallet)

5.4 - Walking

Finally, walking shows a stark difference to the other three activities. Differences can be seen directly in most of the signals. The ECG clearly shows distortions, which are movement artifacts resulting from the dry electrodes slightly moving during walking. The heart rate is approximately 83 bpm. The SpO2 sensor signals are still in the same range as before.

The data from the Android sensors show a clearly observable periodicity. It is reasonable to assume that this periodicity is directly linked to the cadence of the steps taken during walking. Thus, not only can it be observed that the subject is walking, but it should also be possible to estimate how many steps the subject took during the recording. From this, an estimate of the distance and velocity walked could also be made (assuming a fixed step length).
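The step-count idea can be sketched with the same peak-detection approach: each peak of the mean-removed, periodic acceleration is counted as one step, and distance and speed follow from an assumed fixed step length. Both helpers below are illustrative sketches, not validated pedometer code, and the 0.7 m step length is a hypothetical default:

```python
import numpy as np
from scipy.signal import find_peaks

def count_steps(acc_y, fs=100):
    """Count steps as peaks of the mean-removed y-axis acceleration;
    a minimum inter-peak distance of 0.3 s suppresses secondary peaks."""
    sig = np.asarray(acc_y, dtype=float)
    sig = sig - sig.mean()
    peaks, _ = find_peaks(sig, height=0.5 * sig.std(), distance=int(0.3 * fs))
    return len(peaks)

def walking_estimates(n_steps, duration_s, step_length_m=0.7):
    """Distance (m) and average speed (m/s) from a step count,
    assuming a fixed step length (hypothetical default of 0.7 m)."""
    distance = n_steps * step_length_m
    return distance, distance / duration_s
```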

In [6]:
# set file path
file = '../../images/other/activity_distinction/' + 'walking.txt'

# plot the data that was recorded while walking
plot_grid(file, color_pallet)

6 - Activity distinction: a perfect problem for machine learning

Visually examining the data and trying to determine when a certain activity starts or ends is, of course, far too time-consuming and exhausting. Machine learning, however, would be a perfect tool to solve this task. Assuming one collects enough data, the sensor data could be used to build a classifier. This classifier should then be able to offer accurate and reliable activity tracking using the data that the patient records throughout the recording process.
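As a sketch of how such a classifier could be built, the snippet below cuts a sensor signal into fixed-length windows, computes two simple statistical features per window, and classifies windows with a minimal nearest-centroid rule. Everything here is illustrative: in practice one would use many more features, all sensor axes, and a proper model (e.g. a random forest or an SVM from scikit-learn):

```python
import numpy as np

def window_features(sig, fs=100, win_s=2):
    """Split a 1-D signal into non-overlapping windows of win_s seconds
    and compute per-window standard deviation and peak-to-peak range,
    two common hand-crafted features for activity recognition."""
    w = fs * win_s
    n = len(sig) // w
    windows = np.asarray(sig[:n * w], dtype=float).reshape(n, w)
    return np.c_[windows.std(axis=1), np.ptp(windows, axis=1)]

class NearestCentroid:
    """Tiny stand-in for a real classifier: each activity class is
    represented by the mean of its training feature vectors, and a
    window is assigned to the closest centroid."""
    def fit(self, X, y):
        self.labels_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.labels_[np.argmin(d, axis=1)]
```

Because the walking signal is strongly periodic while the static activities are nearly flat, even these two features separate the window classes well.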

Activity distinction or, in machine learning terms, activity recognition is a problem that is still actively researched in the scientific community. Several papers tackle this problem; we recommend reading the following papers to get a good overview of the topic and of the different approaches.

Of course, when tackling this problem one does not have to restrict oneself to only using the accelerometer, gyroscope, and magnetometer. The Android system offers a great variety of other sensors. The light and proximity sensors could be used to detect when the patient/subject pulls the phone out of the pocket. Integration of GPS data could help distinguish between sitting in front of a desk, in a car, in a train, etc. The OpenSignals mobile application also allows you to include biosignals, such as EMG, ECG, fNIRS, and many more, into your model. The possibilities are basically endless!

In this Jupyter notebook we showed that Android sensors can be leveraged to distinguish between different activities such as sitting, lying down, standing, and walking.

We hope that you have enjoyed this guide. biosignalsnotebooks is an environment in continuous expansion, so don't stop your journey and learn more with the remaining Notebooks.

In [7]:
from biosignalsnotebooks.__notebook_support__ import css_style_apply
css_style_apply()
.................... CSS Style Applied to Jupyter Notebook .........................
Out[7]: